# Table Question Answering

| Model | Developer | License | Tags | Downloads | Likes | Description |
| --- | --- | --- | --- | --- | --- | --- |
| Fintabqa | ethanbradley | MIT | Question Answering System · Transformers · English | 128 | 0 | A financial table question answering model based on the LayoutLM architecture, designed to answer structured questions over financial tables. |
| Omnitab Large | neulab | | Question Answering System · Transformers · English | 158 | 2 | A BART-based table question answering model that gains few-shot table QA ability through pre-training on natural and synthetic data. |
| Omnitab Large Finetuned Wtq | neulab | | Question Answering System · Transformers · English | 55 | 7 | An OmniTab table question answering model that achieves few-shot table QA through pre-training on natural and synthetic data, fine-tuned on WikiTableQuestions. |
| Tapex Base Finetuned Wtq | microsoft | MIT | Question Answering System · Transformers · English | 162 | 11 | A table pre-training model that learns through a neural SQL executor, designed for table question answering. |
| Tapex Large | microsoft | MIT | Large Language Model · Transformers · English | 252 | 9 | A BART-based TAPEX model for table reasoning tasks, pre-trained through a neural SQL executor. |
| Tapex Large Finetuned Wikisql | microsoft | MIT | Large Language Model · Transformers · English | 676 | 16 | A BART-based TAPEX model pre-trained through a neural SQL executor, designed for table reasoning tasks. |
| Tapas Small Masklm | google | | Large Language Model · Transformers | 14 | 1 | TAPAS (Table Parsing) is a table-oriented pre-trained language model from Google Research, designed for tabular data and natural language queries. |
| Tapas Small Finetuned Sqa | google | Apache-2.0 | Question Answering System · Transformers · English | 759 | 1 | A small TAPAS model with intermediate pre-training, fine-tuned on the SQA dataset; suited to conversational table question answering. |
| Tapex Base Finetuned Wikisql | microsoft | MIT | Question Answering System · Transformers · English | 242 | 18 | A BART-based TAPEX model, pre-trained on tables through a neural SQL executor and designed for table question answering. |
| Tapex Large Finetuned Sqa | nielsr | Apache-2.0 | Question Answering System · Transformers · English | 30 | 0 | TAPEX-large pre-trained on tabular data through a neural SQL executor and fine-tuned for table question answering; answers natural language questions about table content. |
| Tapas Small | google | Apache-2.0 | Large Language Model · Transformers · English | 41 | 0 | A Transformer-based table question answering model, pre-trained self-supervised on Wikipedia tables and associated text; supports table understanding and question answering. |
| Tapas Mini | google | Apache-2.0 | Large Language Model · Transformers · English | 15 | 0 | A BERT-like Transformer model for tabular data and related text, pre-trained self-supervised on Wikipedia tables. |
| Tapas Base Masklm | google | | Large Language Model · Transformers | 148 | 0 | A pre-trained language model from Google, built specifically for table-related tasks. |
| Tapas Tiny Masklm | google | | Large Language Model · Transformers | 16 | 0 | A table-oriented pre-trained language model for tasks over tabular data. |
| Tapas Large Masklm | google | | Large Language Model · Transformers | 15 | 2 | A pre-trained language model over tabular data, designed for natural language tasks involving tables. |
| Tapas Tiny | google | Apache-2.0 | Large Language Model · Transformers · English | 44 | 0 | A Transformer-based table question answering model, pre-trained self-supervised on English Wikipedia tables; supports table QA and entailment tasks. |
| Tapas Medium Finetuned Wikisql Supervised | google | Apache-2.0 | Question Answering System · Transformers · English | 19 | 0 | A Transformer-based table question answering model, pre-trained self-supervised on English Wikipedia tables and fine-tuned with supervision on WikiSQL. |
| Tapas Base Finetuned Wikisql Supervised | google | Apache-2.0 | Question Answering System · Transformers · English | 737 | 9 | A BERT-based Transformer model for table question answering, pre-trained self-supervised on English Wikipedia tables; supports weakly supervised table parsing. |
| Tapex Base | microsoft | MIT | Large Language Model · Transformers · English | 799 | 43 | A table pre-training model that learns through a neural SQL executor; handles table reasoning tasks. |
| Tapas Base | google | Apache-2.0 | Large Language Model · Transformers · English | 2,457 | 7 | A BERT-based table understanding model, pre-trained self-supervised on Wikipedia tables; supports table question answering and statement verification. |
| Tapas Base Finetuned Sqa | google | Apache-2.0 | Question Answering System · Transformers · English | 1,867 | 6 | A BERT-based table question answering model with intermediate pre-training for numerical reasoning, fine-tuned on the SQA dataset. |
| Tapas Medium | google | Apache-2.0 | Large Language Model · Transformers · English | 23 | 0 | A Transformer-based table question answering model, pre-trained self-supervised on English Wikipedia tables and associated text. |
| Tapas Large Finetuned Sqa | google | Apache-2.0 | Question Answering System · Transformers · English | 71 | 7 | The large TAPAS variant, fine-tuned for sequential question answering (SQA); suited to conversational table question answering. |
| Tapas Mini Finetuned Sqa | google | Apache-2.0 | Question Answering System · Transformers · English | 24 | 4 | A mini TAPAS model with intermediate pre-training, fine-tuned on the SQA dataset; uses relative position embeddings. |
| Tapas Medium Masklm | google | | Large Language Model · Transformers | 14 | 1 | A table-oriented pre-trained language model for tabular data and related queries. |
| Tapas Medium Finetuned Wtq | google | Apache-2.0 | Question Answering System · Transformers · English | 77 | 2 | A medium-sized TAPAS model fine-tuned on the WikiTableQuestions dataset for table question answering. |
| Tapas Tiny Finetuned Wtq | google | Apache-2.0 | Question Answering System · Transformers · English | 1,894 | 1 | A tiny Transformer model for table question answering, trained through intermediate pre-training and chained fine-tuning across multiple datasets. |
| Tapas Small Finetuned Wtq | google | Apache-2.0 | Question Answering System · Transformers · English | 406 | 5 | A small TAPAS model fine-tuned on the WikiTableQuestions dataset for table question answering. |
| Tapas Large | google | Apache-2.0 | Large Language Model · Transformers · English | 211 | 2 | A BERT-like Transformer model for tabular data and related text, pre-trained self-supervised on a large corpus of English Wikipedia tables and associated text. |
| Tapas Base Finetuned Wtq | google | Apache-2.0 | Question Answering System · Transformers · English | 23.03k | 217 | A Transformer-based table question answering model, pre-trained self-supervised on Wikipedia tables and fine-tuned on datasets such as WTQ. |
| Tapas Mini Finetuned Wtq | google | Apache-2.0 | Question Answering System · Transformers · English | 35 | 2 | A mini TAPAS model fine-tuned on the WikiTableQuestions (WTQ) dataset for table question answering. |
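
The microsoft TAPEX checkpoints above are BART seq2seq models: the tokenizer flattens a table plus a question into one input sequence, and the answer is generated as text rather than selected from cells. The neulab OmniTab model cards show the same `table=`/`query=` pattern via `AutoTokenizer`/`AutoModelForSeq2SeqLM`. A minimal sketch, assuming `transformers` (v4.17+, which ships `TapexTokenizer`) and `pandas` are installed; the table, question, and expected output are illustrative only.

```python
import pandas as pd
from transformers import TapexTokenizer, BartForConditionalGeneration

# Any TAPEX checkpoint from the table above should follow this pattern;
# the WTQ-finetuned base model is used here as an example.
model_name = "microsoft/tapex-base-finetuned-wtq"
tokenizer = TapexTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

# Toy table as a pandas DataFrame (string cells keep tokenization simple).
table = pd.DataFrame({
    "year": ["1896", "2004", "2008", "2012"],
    "city": ["athens", "athens", "beijing", "london"],
})
query = "in which year did beijing host the olympic games?"

# The tokenizer linearizes the table together with the question.
encoding = tokenizer(table=table, query=query, return_tensors="pt")

# The BART decoder generates the answer as free text.
outputs = model.generate(**encoding)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))  # e.g. [' 2008']
```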
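
The google TAPAS checkpoints are BERT-style encoders that answer by selecting table cells, optionally combined with an aggregation operator (COUNT, SUM, AVERAGE), rather than by generating text. A minimal sketch using the Transformers `table-question-answering` pipeline, again assuming `transformers` and `pandas` are installed; the table and question are illustrative.

```python
import pandas as pd
from transformers import pipeline

# Any fine-tuned TAPAS checkpoint from the table above can be dropped in;
# the WTQ model also predicts an aggregation operator over the selected cells.
tqa = pipeline("table-question-answering", model="google/tapas-base-finetuned-wtq")

# TAPAS tokenization requires every cell to be a string.
table = pd.DataFrame({
    "Repository": ["Transformers", "Datasets", "Tokenizers"],
    "Stars": ["36542", "4512", "3934"],
})

result = tqa(table=table, query="How many stars does the transformers repository have?")
print(result["answer"])  # e.g. 'AVERAGE > 36542'
print(result["cells"])   # the cell values the model selected
```

Note that the `*-finetuned-sqa` variants expect a sequence of conversational questions over the same table, while the `*-masklm` entries are masked-LM pre-training checkpoints rather than ready-to-use QA models.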